XLM-RoBERTa Base Intent Twin
License: MIT
XLM-RoBERTa-base is a multilingual pre-trained model based on the RoBERTa architecture. This variant supports Russian and English and is suitable for text classification tasks.
Task: Text Classification
Library: Transformers
Languages: Multilingual (Russian, English)
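Below is a minimal usage sketch with the Hugging Face Transformers text-classification pipeline. The model identifier is an assumed placeholder derived from the model name (replace it with the actual hub ID), and the example sentences are illustrative only.

```python
# Minimal sketch: load the model for text classification via the
# Transformers pipeline API. The model ID below is a placeholder
# assumption, not a confirmed hub identifier.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="xlm-roberta-base-intent-twin",  # assumed/placeholder model ID
)

# The model card states support for both English and Russian inputs.
print(classifier("Please cancel my subscription."))
print(classifier("Пожалуйста, отмените мою подписку."))
```

Each call returns a list of dictionaries with the predicted label and its score, e.g. `[{'label': '...', 'score': 0.97}]`.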